Disambiguation of Pattern Sequences with Recurrent Networks

Authors

  • Ali A. Minai
  • Geoffrey L. Barrows
  • William B. Levy
Abstract

Recently, there has been great interest in using neural networks to learn sequences of patterns. This is obviously very important from a cognitive point of view. In this paper, we show how a simple network with a hippocampal-like structure can be used to learn complex stimulus sequences. Specifically, we train the system, which has one-step recurrent dynamics, to disambiguate sequences with temporal overlaps of more than one step. Usually, this is done either through delay lines or by means of capacitive effects. We show that a significant population of unforced, recurrently activated neurons enables the system to disambiguate quite well over several time steps.
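The mechanism the abstract describes can be made concrete with a toy simulation: a one-step recurrent network in which some neurons fire because the current input pattern forces them, while other, unforced neurons fire from recurrent input alone and so carry information about earlier patterns across an ambiguous overlap. The NumPy sketch below is an illustration under assumed parameters, not the authors' model; the network size, the fixed random weights, and the k-winners-take-all stand-in for inhibition are choices made only for this example.

import numpy as np

rng = np.random.default_rng(0)

# Assumed parameters for the illustration (not taken from the paper).
N = 200            # binary neurons
P_CONN = 0.1       # sparse recurrent connectivity
K_UNFORCED = 20    # recurrently activated neurons allowed to fire per step

# Fixed sparse random recurrent weights, no self-connections.
mask = rng.random((N, N)) < P_CONN
np.fill_diagonal(mask, False)
W = mask * rng.random((N, N))

def step(z_prev, x_t):
    """One-step recurrent dynamics: z(t) depends only on z(t-1) and x(t).
    Neurons in x_t are externally forced; in addition, the K_UNFORCED neurons
    with the strongest recurrent drive fire as the unforced population
    (a k-winners rule standing in for inhibitory competition)."""
    unforced = np.zeros(N)
    if z_prev.any():
        winners = np.argsort(W @ z_prev)[-K_UNFORCED:]
        unforced[winners] = 1.0
    return np.maximum(unforced, x_t)

def run(sequence):
    z = np.zeros(N)
    states = []
    for x_t in sequence:
        z = step(z, x_t)
        states.append(z)
    return states

def pattern(active_ids):
    x = np.zeros(N)
    x[active_ids] = 1.0
    return x

shared = pattern(range(90, 100))                       # identical middle input
seq_A = [pattern(range(0, 10)),  shared, pattern(range(110, 120))]
seq_B = [pattern(range(20, 30)), shared, pattern(range(130, 140))]

states_A, states_B = run(seq_A), run(seq_B)

# External input is identical at the overlapping step, yet the full network
# states differ because the unforced neurons still reflect the distinct
# starting patterns -- the information a readout needs for disambiguation.
a, b = states_A[1], states_B[1]
jaccard = (a * b).sum() / np.maximum(a, b).sum()
print(f"State overlap (Jaccard) at the shared step: {jaccard:.2f}")

Because the states at the overlap already differ, a readout trained on the full network state, rather than on the external pattern alone, can in principle predict the correct successor several steps into an ambiguous subsequence, which is the effect the paper studies.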


Similar articles

Robust stability of stochastic fuzzy impulsive recurrent neural networks with time-varying delays

In this paper, global robust stability of stochastic impulsive recurrent neural networks with time-varying delays, which are represented by Takagi-Sugeno (T-S) fuzzy models, is considered. A novel Linear Matrix Inequality (LMI)-based stability criterion is obtained by using Lyapunov functional theory to guarantee the asymptotic stability of uncertain fuzzy stochastic impulsive recurrent neural...

Word embeddings and recurrent neural networks based on Long-Short Term Memory nodes in supervised biomedical word sense disambiguation

Word sense disambiguation helps identify the proper sense of ambiguous words in text. With large terminologies such as the UMLS Metathesaurus, ambiguities appear and highly effective disambiguation methods are required. Supervised learning methods are used as one approach to perform disambiguation. Features extracted from the context of an ambiguous word are used to identif...
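As a rough illustration of the general recipe this snippet describes (embed the words around an ambiguous term, run an LSTM over the context window, and classify the sense from the final state), the PyTorch sketch below uses placeholder vocabulary, embedding, and sense counts; the actual systems rely on pretrained biomedical word embeddings and UMLS senses, none of which are reproduced here.

import torch
import torch.nn as nn

# Placeholder sizes -- real systems use UMLS-derived senses and pretrained
# biomedical word embeddings; these numbers are assumptions for the sketch.
VOCAB_SIZE, EMBED_DIM, HIDDEN_DIM, NUM_SENSES = 5000, 100, 128, 3

class LSTMSenseTagger(nn.Module):
    """Supervised WSD sketch: embed the words around an ambiguous term,
    run an LSTM over the context window, and classify the sense from the
    final hidden state."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, EMBED_DIM)
        self.lstm = nn.LSTM(EMBED_DIM, HIDDEN_DIM, batch_first=True)
        self.classify = nn.Linear(HIDDEN_DIM, NUM_SENSES)

    def forward(self, context_ids):           # (batch, window_length)
        embedded = self.embed(context_ids)    # (batch, window, EMBED_DIM)
        _, (h_n, _) = self.lstm(embedded)     # h_n: (1, batch, HIDDEN_DIM)
        return self.classify(h_n[-1])         # sense logits

# Toy usage: two context windows of five word ids each, with gold senses.
model = LSTMSenseTagger()
contexts = torch.randint(0, VOCAB_SIZE, (2, 5))
senses = torch.tensor([0, 2])
loss = nn.functional.cross_entropy(model(contexts), senses)
loss.backward()
print(loss.item())

In a real setup the embedding layer would typically be initialized from pretrained vectors rather than learned from scratch.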

Interactive Language Understanding with Multiple Timescale Recurrent Neural Networks

Natural language processing in the human brain is complex and dynamic. Models for understanding how the brain’s architecture acquires language need to take into account the temporal dynamics of verbal utterances as well as of action and visual embodied perception. We propose an architecture based on three Multiple Timescale Recurrent Neural Networks (MTRNNs) interlinked in a cell assembly tha...

Part-of-Speech Tagging with Recurrent Neural Networks

This paper explores the use of discrete-time recurrent neural networks for part-of-speech disambiguation of textual corpora. Our approach does not need a hand-tagged text for training the tagger, making it probably the first neural approach to do so. Preliminary results show that the performance of this approach is, at least, similar to that of a standard hidden Markov model trained using the Baum-W...

Online tool wear monitoring in turning using time-delay neural networks

Wear monitoring systems often use neural networks for sensor fusion with multiple input patterns. Systems for continuous online supervision of wear have to process pattern sequences; therefore, recurrent neural networks have been investigated in the past. However, in most cases where only noisy input or even noisy output patterns are available for supervised learning, success is not forthc...



Publication date: 2006